Projection Pursuit Through Relative Entropy Minimization
Author
Abstract
Consider a defined density on a set of very large dimension. It is quite difficult to find an estimate of this density from a data set. However, it is possible to solve this problem through a projection pursuit methodology. In his seminal article, Huber (see "Projection Pursuit", Annals of Statistics, 1985) demonstrates the interest of his method in a very simple case: he considers the factorization of the density into a Gaussian component and some residual density. Huber's work is based on maximizing relative entropy. Our proposal leads to a new algorithm. Furthermore, we consider the case when the density to be factorized is estimated from an i.i.d. sample; in this case, we propose a test for the factorization of the estimated density.
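To give a concrete feel for the projection-pursuit idea the abstract refers to, here is a minimal, purely illustrative Python sketch (not the algorithm proposed in the paper): it searches for a unit direction along which the projected sample departs most from Gaussianity, scoring each candidate direction by a histogram estimate of the Kullback-Leibler divergence between the projected density and the Gaussian with matching mean and variance. The function names (kl_to_gaussian, pursue_direction), the random-search strategy, and the histogram KL estimator are our own assumptions, chosen for brevity.

```python
# Illustrative sketch only: crude projection pursuit via relative-entropy scoring.
import numpy as np
from scipy.stats import norm


def kl_to_gaussian(z, bins=30):
    """Histogram estimate of KL(p || N(mu, sigma^2)) for a 1-D sample z."""
    mu, sigma = z.mean(), z.std(ddof=1)
    counts, edges = np.histogram(z, bins=bins, density=True)
    widths = np.diff(edges)
    centers = 0.5 * (edges[:-1] + edges[1:])
    q = norm.pdf(centers, loc=mu, scale=sigma)
    mask = (counts > 0) & (q > 0)
    # Riemann-sum approximation of the integral of p * log(p / q).
    return float(np.sum(counts[mask] * np.log(counts[mask] / q[mask]) * widths[mask]))


def pursue_direction(X, n_candidates=2000, seed=0):
    """Return the random unit direction whose projection looks least Gaussian."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    best_a, best_score = None, -np.inf
    for _ in range(n_candidates):
        a = rng.standard_normal(d)
        a /= np.linalg.norm(a)
        score = kl_to_gaussian(X @ a)
        if score > best_score:
            best_a, best_score = a, score
    return best_a, best_score


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Toy data: Gaussian in most coordinates, bimodal (non-Gaussian) first coordinate.
    n = 5000
    X = rng.standard_normal((n, 5))
    X[:, 0] = np.where(rng.random(n) < 0.5, -2.0, 2.0) + 0.5 * rng.standard_normal(n)
    a, score = pursue_direction(X)
    print("selected direction:", np.round(a, 2), "KL estimate:", round(score, 3))
```

In this toy example the search should concentrate on the first coordinate, the only one whose marginal is markedly non-Gaussian; an actual implementation in the spirit of Huber's factorization would then divide out the fitted Gaussian component along that direction and iterate on the residual density.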
Similar articles
Minimization Problems Based on a Parametric Family of Relative Entropies I: Forward Projection
Minimization problems with respect to a one-parameter family of generalized relative entropies are studied. These relative entropies, which we term relative α-entropies (denoted Iα), arise as redundancies under mismatched compression when cumulants of compressed lengths are considered instead of expected compressed lengths. These parametric relative entropies are a generalization of the usual r...
Relative α-entropy minimizers subject to linear statistical constraints
We study minimization of a parametric family of relative entropies, termed relative α-entropies (denoted Iα(P,Q)). These arise as redundancies under mismatched compression when cumulants of compressed lengths are considered instead of expected compressed lengths. These parametric relative entropies are a generalization of the usual relative entropy (KullbackLeibler divergence). Just like relati...
New Approximations of Differential Entropy for Independent Component Analysis and Projection Pursuit
We derive a first-order approximation of the density of maximum entropy for a continuous 1-D random variable, given a number of simple constraints. This results in a density expansion which is somewhat similar to the classical polynomial density expansions by Gram-Charlier and Edgeworth. Using this approximation of density, an approximation of 1-D differential entropy is derived. The approximation...
Minimization Problems Based on Relative $\alpha$-Entropy II: Reverse Projection
In part I of this two-part work, certain minimization problems based on a parametric family of relative entropies (denoted Iα) were studied. Such minimizers were called forward Iα-projections. Here, a complementary class of minimization problems leading to the so-called reverse Iα-projections are studied. Reverse Iα-projections, particularly on log-convex or power-law families, are of interest ...
Journal: Communications in Statistics - Simulation and Computation
Volume: 40, Issue: -
Pages: -
Published: 2011